Search for: All records

Creators/Authors contains: "Gentine, Pierre"


  1. Precise and reliable climate projections are required for climate adaptation and mitigation, but Earth system models still exhibit large uncertainties. Several approaches have been developed to reduce the spread of climate projections and feedbacks, yet those methods cannot capture the nonlinear complexity inherent in the climate system. Using a Transfer Learning approach, we show that Machine Learning can be used to optimally leverage and merge the knowledge gained from global temperature maps simulated by Earth system models and observed in the historical period to reduce the spread of global surface air temperature fields projected in the 21st century. We achieve an uncertainty reduction of more than 50% with respect to state-of-the-art approaches, while showing that our method provides improved regional temperature patterns together with narrower projection uncertainty, which is urgently required for climate adaptation.
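The abstract describes pretraining on Earth system model output and then transferring that knowledge to historical observations. The sketch below is a minimal, generic transfer-learning illustration of that idea, not the authors' code: the small CNN, the grid size, and the random stand-in tensors are all assumptions.

```python
# Minimal transfer-learning sketch (assumption: not the authors' code).
# A small CNN is pretrained on ESM-simulated temperature maps, then the
# shared feature layers are frozen and the head is fine-tuned on
# historical observations. Random tensors stand in for real gridded data.
import torch
import torch.nn as nn

class TempCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
        )
        self.head = nn.Conv2d(16, 1, 3, padding=1)

    def forward(self, x):
        return self.head(self.features(x))

def fit(model, x, y, params, epochs=5):
    opt = torch.optim.Adam(params, lr=1e-3)
    for _ in range(epochs):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(x), y)
        loss.backward()
        opt.step()

model = TempCNN()
x_sim, y_sim = torch.randn(8, 1, 32, 64), torch.randn(8, 1, 32, 64)  # ESM fields (placeholder)
x_obs, y_obs = torch.randn(4, 1, 32, 64), torch.randn(4, 1, 32, 64)  # observations (placeholder)

fit(model, x_sim, y_sim, model.parameters())        # pretrain on simulations
for p in model.features.parameters():               # freeze shared feature layers
    p.requires_grad = False
fit(model, x_obs, y_obs, model.head.parameters())   # fine-tune the head on observations
```

Freezing the feature layers is only one way to transfer; partial fine-tuning or reweighting schemes would follow the same pretrain-then-adapt pattern.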
  2. Abstract: The increasing size and severity of wildfires across the western United States have generated dangerous levels of PM2.5 concentrations in recent years. In a changing climate, expanding the use of prescribed fires is widely considered to be the most robust fire mitigation strategy. However, reliably forecasting the potential air quality impact of prescribed fires at hourly to daily time scales, which is critical for planning their location and timing, remains a challenging problem. In this paper, we introduce a spatio-temporal graph neural network (GNN)-based forecasting model for hourly PM2.5 predictions across California. Using a two-step approach, we apply our forecasting model to predict the net and ambient PM2.5 concentrations, which are used to estimate wildfire contributions. Integrating the GNN-based PM2.5 forecasting model with simulations of historical prescribed fires, we propose a novel framework to forecast their air quality impact. This framework determines that March is the optimal month for implementing prescribed fires in California and quantifies the potential air quality trade-offs involved in conducting more prescribed fires outside the peak of the fire season.
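As a rough illustration of a spatio-temporal GNN for hourly PM2.5, the sketch below combines graph message passing over monitoring sites with a GRU over the hourly history. The adjacency matrix, feature dimensions, and single-step forecasting target are placeholders, not details taken from the paper.

```python
# Hedged sketch of a spatio-temporal GNN step (illustrative only): neighbor
# aggregation over a site graph followed by a GRU over the hourly sequence.
import torch
import torch.nn as nn

class SpatioTemporalGNN(nn.Module):
    def __init__(self, n_feat, hidden=32):
        super().__init__()
        self.mix = nn.Linear(n_feat, hidden)        # transform node features
        self.gru = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)             # hourly PM2.5 prediction

    def forward(self, x, adj):
        # x: (nodes, time, features); adj: row-normalized (nodes, nodes)
        h = torch.relu(self.mix(x))
        h = torch.einsum('ij,jtf->itf', adj, h)     # aggregate neighbor information
        h, _ = self.gru(h)                          # temporal encoding per node
        return self.out(h[:, -1])                   # next-hour estimate per node

nodes, time, feats = 10, 24, 5
adj = torch.softmax(torch.randn(nodes, nodes), dim=1)   # placeholder site graph
x = torch.randn(nodes, time, feats)                     # placeholder met/PM2.5 features
print(SpatioTemporalGNN(feats)(x, adj).shape)           # torch.Size([10, 1])
```

The two-step strategy in the abstract (net versus ambient PM2.5) would amount to training two such models and differencing their outputs to isolate the fire contribution.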
  3. Abstract: Extreme winds associated with tropical cyclones (TCs) can cause significant loss of life and economic damage globally, highlighting the need for accurate, high-resolution wind modeling and forecasting. However, due to their coarse horizontal resolution, most global climate and weather models suffer from chronic underprediction of TC wind speeds, limiting their use for impact analysis and energy modeling. In this study, we introduce a cascading deep learning framework designed to downscale high-resolution TC wind fields from low-resolution data. Our approach maps 85 TC events from ERA5 data (0.25° resolution) to high-resolution (0.05° resolution) observations at 6-hr intervals. The first component is a debiasing neural network designed to model accurate wind speed observations from ERA5 data. The second component employs a generative super-resolution strategy based on a conditional denoising diffusion probabilistic model (DDPM) to enhance the spatial resolution and produce ensemble estimates. The model accurately reproduces storm intensity and produces realistic radial profiles and fine-scale spatial structures of the wind fields, with a percentage mean bias of −3.74% compared to the high-resolution observations. Our downscaling framework enables the prediction of high-resolution wind fields using widely available low-resolution and intensity wind data, allowing for the modeling of past events and the assessment of future TC risks.
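The cascade described above pairs a debiasing network with a conditional DDPM for super-resolution. The toy sketch below shows that two-stage structure with a simplified reverse-diffusion loop; the tiny convolutional nets, noise schedule, and patch sizes are invented for illustration, and the denoiser omits the timestep conditioning a full DDPM would use.

```python
# Sketch of the two-stage cascade (an illustration, not the paper's code):
# stage 1 debiases the coarse ERA5-like wind field; stage 2 runs a toy
# conditional DDPM reverse loop to generate one high-resolution ensemble member.
import torch
import torch.nn as nn
import torch.nn.functional as F

debias = nn.Conv2d(1, 1, 3, padding=1)      # stage 1: bias-correction net (toy)
denoiser = nn.Conv2d(2, 1, 3, padding=1)    # stage 2: predicts noise, conditioned on the upsampled field

T = 50
betas = torch.linspace(1e-4, 0.02, T)
alphas = 1.0 - betas
alpha_bar = torch.cumprod(alphas, dim=0)

@torch.no_grad()
def super_resolve(lowres, scale=5):
    """One ensemble member from the toy conditional reverse-diffusion loop."""
    cond = F.interpolate(lowres, scale_factor=scale, mode='bilinear', align_corners=False)
    x = torch.randn_like(cond)                                  # start from pure noise
    for t in reversed(range(T)):
        eps = denoiser(torch.cat([x, cond], dim=1))             # predicted noise
        coef = betas[t] / torch.sqrt(1.0 - alpha_bar[t])
        x = (x - coef * eps) / torch.sqrt(alphas[t])            # DDPM mean update
        if t > 0:
            x = x + torch.sqrt(betas[t]) * torch.randn_like(x)  # add sampling noise
    return x

era5_wind = torch.randn(1, 1, 16, 16)          # placeholder coarse wind patch
highres = super_resolve(debias(era5_wind))     # cascaded downscaling
print(highres.shape)                           # torch.Size([1, 1, 80, 80])
```

Calling super_resolve repeatedly with different random seeds is what yields the ensemble estimates mentioned in the abstract.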
  4. Abstract: The parameterization of suspended sediments in vegetated flows presents a significant challenge, yet it is crucial across various environmental and geophysical disciplines. This study focuses on modeling suspended sediment concentrations (SSC) in vegetated flows with a canopy density of a_vH ∈ [0.3, 1.0] by examining the turbulent dispersive flux. While conventional studies disregard the dispersive momentum flux for a_vH > 0.1, our findings reveal a significant dispersive sediment flux for large particles with a diameter-to-Kolmogorov-length ratio d_p/η > 0.1. Traditional Rouse-like approaches must therefore be revised to account for this effect. We introduce a hybrid methodology that combines physical modeling with machine learning to parameterize the dispersive flux, guided by constraints from the diffusive and settling fluxes, characterized using recent covariance and turbulent settling methods, respectively. The model predictions align well with reported SSC data, demonstrating the versatility of the model in parameterizing sediment-vegetation interactions in turbulent flows.
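A hedged sketch of how a hybrid physics/ML parameterization of the dispersive flux might be wired together: a Rouse-type flux balance is integrated vertically, with the dispersive diffusivity supplied by a stand-in regressor on a_vH and d_p/η. All coefficients, training points, and profile shapes below are synthetic placeholders, not values from the study.

```python
# Illustrative hybrid parameterization (assumptions throughout): the vertical
# SSC profile obeys a balance between settling and upward turbulent + dispersive
# fluxes, and a small linear regressor stands in for the learned dispersive term.
import numpy as np

def ssc_profile(z, ws, diffusivity, dispersive, c_ref):
    """Integrate ws*C + (D_turb + D_disp) dC/dz = 0 upward from a reference SSC."""
    c = np.empty_like(z)
    c[0] = c_ref
    for i in range(1, len(z)):
        dz = z[i] - z[i - 1]
        dcdz = -ws * c[i - 1] / (diffusivity[i - 1] + dispersive[i - 1])
        c[i] = c[i - 1] + dcdz * dz
    return c

# Placeholder ML correction: a linear fit of dispersive diffusivity on
# (a_v*H, d_p/eta), standing in for the trained model in the study.
train_X = np.array([[0.3, 0.05], [0.6, 0.15], [1.0, 0.30]])
train_y = np.array([1e-4, 6e-4, 2e-3])                     # synthetic targets
coef, *_ = np.linalg.lstsq(np.c_[np.ones(3), train_X], train_y, rcond=None)

z = np.linspace(0.01, 0.5, 50)                             # heights above bed (m)
avH, dp_eta = 0.6, 0.2                                     # canopy density, particle-size ratio
d_disp = np.full_like(z, coef @ np.array([1.0, avH, dp_eta]))
d_turb = 0.41 * 0.05 * z * (1 - z / 0.5)                   # parabolic eddy diffusivity (toy)
print(ssc_profile(z, ws=0.01, diffusivity=d_turb + 1e-6, dispersive=d_disp, c_ref=1.0)[-1])
```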
  5. Obtaining accurate and dense three-dimensional estimates of turbulent wall-bounded flows is notoriously challenging, and this limitation negatively impacts geophysical and engineering applications, such as weather forecasting, climate predictions, air quality monitoring, and flow control. This study introduces a physics-informed variational autoencoder model that reconstructs realizable three-dimensional turbulent velocity fields from two-dimensional planar measurements thereof. Physics knowledge is introduced as soft and hard constraints in the loss term and network architecture, respectively, to enhance model robustness and leverage inductive biases alongside observational ones. The performance of the proposed framework is examined in a turbulent open-channel flow application at friction Reynolds number Re_τ = 250. The model excels in precisely reconstructing the dynamic flow patterns at any given time and location, including turbulent coherent structures, while also providing accurate time- and spatially-averaged flow statistics. The model outperforms state-of-the-art classical approaches for flow reconstruction, such as the linear stochastic estimation method. Physical constraints provide a modest but discernible improvement in the prediction of small-scale flow structures and maintain better consistency with the fundamental equations governing the system when compared to a purely data-driven approach.
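To make the soft-constraint idea concrete, the sketch below pairs a toy plane-to-volume VAE with a finite-difference divergence penalty added to the usual reconstruction and KL terms. The architecture, grid size, and loss weighting are assumptions for illustration; the paper's network and its hard architectural constraints are not reproduced here.

```python
# Minimal physics-informed VAE sketch (illustrative, heavily simplified):
# a 2D planar slice is encoded and decoded into a 3D velocity block, with a
# soft incompressibility (divergence) penalty added to the VAE loss.
import torch
import torch.nn as nn

NX = NY = NZ = 8  # toy grid

class PlaneToVolumeVAE(nn.Module):
    def __init__(self, latent=16):
        super().__init__()
        self.enc = nn.Sequential(nn.Flatten(), nn.Linear(3 * NX * NY, 64), nn.ReLU())
        self.mu, self.logvar = nn.Linear(64, latent), nn.Linear(64, latent)
        self.dec = nn.Linear(latent, 3 * NX * NY * NZ)

    def forward(self, plane):
        h = self.enc(plane)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)   # reparameterization trick
        vol = self.dec(z).view(-1, 3, NX, NY, NZ)                 # (u, v, w) on a 3D grid
        return vol, mu, logvar

def divergence_penalty(vol):
    """Soft constraint: mean squared finite-difference du/dx + dv/dy + dw/dz."""
    du = vol[:, 0, 1:, :-1, :-1] - vol[:, 0, :-1, :-1, :-1]
    dv = vol[:, 1, :-1, 1:, :-1] - vol[:, 1, :-1, :-1, :-1]
    dw = vol[:, 2, :-1, :-1, 1:] - vol[:, 2, :-1, :-1, :-1]
    return ((du + dv + dw) ** 2).mean()

model = PlaneToVolumeVAE()
plane = torch.randn(4, 3, NX, NY)              # placeholder planar measurements
target = torch.randn(4, 3, NX, NY, NZ)         # placeholder 3D reference field
vol, mu, logvar = model(plane)
kl = -0.5 * torch.mean(1 + logvar - mu ** 2 - logvar.exp())
loss = nn.functional.mse_loss(vol, target) + kl + 0.1 * divergence_penalty(vol)
loss.backward()
```

Here the divergence term plays the role of the soft constraint; a hard constraint would instead be built into the decoder so that its output satisfies continuity by construction.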